Void Probabilities and Likelihood Approximation for Gibbs Processes
Mathis Rost (Chalmers)
Abstract: When fitting a model to data, one would ideally like to use maximum likelihood estimation, due to its nice statistical properties. Unfortunately, the likelihood function of a general Gibbs point process is typically intractable due to the associated normalizing constant. This has led to the development of a range of alternative methods, such as Takacs-Fiksel estimation (including its special case, pseudolikelihood estimation) and Point Process Learning. Leveraging recent probabilistic results for Gibbs processes, in this talk we present an approach to approximate likelihood estimation for such processes. Specifically, we show that the likelihood function can be expressed entirely in terms of the Papangelou conditional intensity, which is typically known and tractable. This new likelihood representation involves an infinite series expansion, and we discuss different ways of approximating the expansion, and thereby the likelihood function. We further discuss how this plays out in specific models and compare the approach to the state-of-the-art.
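For orientation, here is a minimal sketch of the standard objects the abstract refers to; the notation (f, Phi, Z, lambda) is generic, assuming a hereditary density with respect to a unit-rate Poisson process, and the talk's own series expansion is not reproduced here. A Gibbs point process on a bounded window is commonly specified through a density of the form

\[
f(\mathbf{x}) \;=\; \frac{1}{Z}\,\exp\{-\Phi(\mathbf{x})\},
\]

where \(\Phi\) is an energy functional and \(Z\) is the normalizing constant that makes maximum likelihood intractable. The Papangelou conditional intensity is the ratio

\[
\lambda(u,\mathbf{x}) \;=\; \frac{f(\mathbf{x}\cup\{u\})}{f(\mathbf{x})} \;=\; \exp\bigl\{-\bigl(\Phi(\mathbf{x}\cup\{u\})-\Phi(\mathbf{x})\bigr)\bigr\}, \qquad u \notin \mathbf{x},
\]

in which \(Z\) cancels, so \(\lambda\) is tractable whenever \(\Phi\) is; this cancellation is what Takacs-Fiksel and pseudolikelihood estimation exploit, and it is the object the talk's likelihood representation is built from. The void probabilities of the title, \(P(X \cap B = \emptyset)\) for compact sets \(B\), determine the distribution of a simple point process (Rényi's theorem), which is plausibly the probabilistic handle connecting the likelihood to \(\lambda\).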
machine learning, probability, statistics theory
Audience: researchers in the discipline
Series comments: The Gothenburg statistics seminar is open to the interested public; everybody is welcome. It usually takes place in MVL14 (http://maps.chalmers.se/#05137ad7-4d34-45e2-9d14-7f970517e2b60; see the specific talk). Speakers are asked to prepare material for 35 minutes, excluding questions from the audience.
Organizers: Akash Sharma*, Helga Kristín Ólafsdóttir*
*contact for this listing
